Implementation Testing Framework

Implementation testing must be completed to validate your program prior to launch. We recommend the following implementation tests.

Actions

Standard Action Logging and Metadata

  • Ensure that every action you plan to use is configured appropriately. See Create an Action for configuration details.
  • Make a POST /users/{userId}/actions call for each action in your program's launch plan, typically with one test user per group. Originate the test of each action in the source system (a request sketch follows this list).
  • Use the GET /users/{userId}/actions API to verify each action is logged in Nitro for the test user.
  • If applicable, verify the action value and metadata name/value pairs were included. Validate all variations of action values and metadata name/value pairs that are in scope for the program.
  • Ensure API calls are throttled to stay under your contractual RPS (Requests Per Second). See throttling recommendations.
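
A minimal sketch of this log-and-verify flow is shown below. The endpoint paths come from the APIs named above; the base URL, authorization header, and payload/response field names (tag, value, metadata, actions) are assumptions to replace with the details from your Nitro integration guide.

```python
import requests

# Assumed environment details -- substitute your actual Nitro host and credentials.
BASE_URL = "https://your-nitro-host/api"
HEADERS = {"Authorization": "Bearer <api-key>"}

def log_action(user_id, action_tag, value=None, metadata=None):
    """Log one action for a test user (payload field names are assumptions)."""
    payload = {"tag": action_tag}
    if value is not None:
        payload["value"] = value
    if metadata:
        payload["metadata"] = metadata                      # e.g. {"articleId": "123"}
    resp = requests.post(f"{BASE_URL}/users/{user_id}/actions",
                         json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def verify_action_logged(user_id, action_tag):
    """Check via the GET API that the action shows up for the test user."""
    resp = requests.get(f"{BASE_URL}/users/{user_id}/actions", headers=HEADERS)
    resp.raise_for_status()
    actions = resp.json().get("actions", [])                # response shape is an assumption
    return any(a.get("tag") == action_tag for a in actions)

if __name__ == "__main__":
    log_action("test-user-1", "comment_article", value=1, metadata={"articleId": "123"})
    assert verify_action_logged("test-user-1", "comment_article")
```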

Batch Actions

Retry

Ensure that retry logic is in place to account for any planned or unplanned Nitro maintenance.
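
For example, a minimal retry sketch with exponential backoff and jitter; the retryable status codes and timing constants are assumptions to tune for your environment and contractual RPS.

```python
import random
import time

import requests

RETRYABLE_STATUS = {429, 500, 502, 503, 504}   # assumed set of retryable responses

def post_with_retry(url, payload, headers, max_attempts=5, base_delay=1.0):
    """POST with exponential backoff so maintenance windows don't drop actions."""
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.post(url, json=payload, headers=headers, timeout=10)
            if resp.status_code not in RETRYABLE_STATUS:
                resp.raise_for_status()     # non-retryable errors (e.g. 400) fail fast
                return resp
        except (requests.ConnectionError, requests.Timeout):
            pass                            # network errors are treated as retryable
        if attempt == max_attempts:
            raise RuntimeError(f"Giving up on {url} after {max_attempts} attempts")
        # Exponential backoff with jitter: ~1s, 2s, 4s, ... plus up to 1s of noise.
        time.sleep(base_delay * 2 ** (attempt - 1) + random.random())
```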

Multiple/redundant Action Logging

A key component of actions is frequent and redundant activity. Some use cases require a repeated action to continue to be sent from the source systems; in other cases, you may want the source system to initiate the action only once.

For each action, identify whether the action can be completed multiple times back to back.

  • If this is allowed, test this functionality in the source system by initiating the action multiple times back to back (e.g., multiple comments on an article). Use the User Management view to verify that multiple action events were received by Nitro. If an action rate limit is in place on the Nitro side, continue to initiate the actions from the source system beyond the rate limit. Verify via the User Management view that the number of action events was capped according to the action rate limit (see the sketch after this list).
  • If this is not allowed, test this functionality in the source system by initiating the action multiple times back to back (e.g., multiple comments on an article). Use the User Management view to verify that only the initial action event was recorded by Nitro. Note: This configuration would likely not use an action rate limit on the Nitro side.
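
A sketch of the back-to-back test, reusing the helpers from the action-logging sketch above; in addition to the User Management view, the GET API can be used to spot-check how many events were kept. The response shape and rate-limit behavior shown are assumptions.

```python
import requests

# Reuses BASE_URL, HEADERS, and log_action() from the action-logging sketch above.

def repeat_action_test(user_id, action_tag, repeats=5, rate_limit=None):
    """Send the same action back to back, then count how many events Nitro kept."""
    for _ in range(repeats):
        log_action(user_id, action_tag)
    resp = requests.get(f"{BASE_URL}/users/{user_id}/actions", headers=HEADERS)
    resp.raise_for_status()
    # Response shape and field names are assumptions -- adjust to the real payload.
    events = [a for a in resp.json().get("actions", []) if a.get("tag") == action_tag]
    expected = min(repeats, rate_limit) if rate_limit else repeats
    print(f"sent={repeats}, recorded={len(events)}, expected at most {expected}")
```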

Backdating

Note: Backdating is an optional feature only used in specific use cases where the time of the action is important, but data is delayed. Default behavior is to log actions with the current date/time.

If you are backdating actions, configure one or more missions with date ranges that include a start and end date. The end date should have already passed (meaning the mission is expired). Then:

  • Complete actions in the source system and pass timestamps between the mission start and end dates. This should complete the missions, even though they are expired (see the sketch after this list).
  • Complete actions in the source system and pass timestamps prior to the mission start date and after the mission end date. These should NOT complete the missions.
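
A sketch of sending backdated timestamps, again reusing the earlier helpers. The timestamp field name, its ISO 8601 format, and the example dates are assumptions; confirm the actual backdating parameter for your integration.

```python
from datetime import datetime, timedelta, timezone

import requests

# Reuses BASE_URL and HEADERS from the action-logging sketch above.

def log_backdated_action(user_id, action_tag, when):
    """Log an action with an explicit (assumed) timestamp field instead of 'now'."""
    payload = {"tag": action_tag, "timestamp": when.isoformat()}
    resp = requests.post(f"{BASE_URL}/users/{user_id}/actions",
                         json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

# Example dates only: a window belonging to an already-expired mission.
mission_start = datetime(2024, 1, 1, tzinfo=timezone.utc)
mission_end = datetime(2024, 1, 31, tzinfo=timezone.utc)

log_backdated_action("test-user-1", "comment_article",
                     mission_start + timedelta(days=5))   # inside window: should complete
log_backdated_action("test-user-1", "comment_article",
                     mission_end + timedelta(days=5))     # outside window: should NOT complete
```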

In the User Management view verify that:

  • The actions were logged with the correct date and time.
  • Missions were completed for actions backdated within the date range.
  • Missions were not completed for actions backdated outside the date range.
  • If applicable, points were awarded based on the mission completion and are shown within the user's points history.

Adding backdated actions can increase your active user count and may impact program costs.

Users

ID

  • For any test users, ensure the user ID displays in Nitro Studio as expected.
  • If multiple systems are involved, ensure the user ID matches across systems by logging an action in each respective system. Use the User Management view to verify that all actions from all systems are included in the user record.
  • When sending user preferences, such as name and/or photo, ensure the preferences are included in the user record.
  • If groups are involved, ensure the user is enrolled in the expected groups.
  • For award account programs:

Missions

Action to Mission Completion

  • Test 20% of missions or 10 missions, whichever is greater (a sample-size sketch follows this list). We recommend testing core variations of missions to ensure adequate coverage for the configuration.
  • Test by logging an action in the source system that should or should not complete the mission for a test user.
  • Log in as the test user to verify the mission was or was not completed, as expected.
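
A small helper that applies the sampling rule above, for planning how many missions to cover:

```python
import math

def missions_to_test(total_missions: int) -> int:
    """Apply the '20% or 10 missions, whichever is greater' sampling rule."""
    return min(total_missions, max(math.ceil(0.2 * total_missions), 10))

# Examples: 80 missions -> 16 to test; 30 missions -> 10; 6 missions -> all 6.
```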

Groups

  1. Create mission 1 with group 1 as a prerequisite.
  2. Create mission 2 with group 2 as a prerequisite.
  3. Access the list of missions for a test user that is in group 1. Ensure mission 1 displays, and mission 2 does not.
  4. Log the action for a user in group 1 which completes mission 1. Verify mission 1 is completed using the User Management view. Mission 2 should not be completed.
  5. Log the action for a user not in group 1 and verify the mission is not completed.

Metadata

  1. Create mission 1 with metadata configured in the rules.
  2. Create mission 2 with different metadata configured in the rules.
  3. Log an action for a test user where the metadata matches mission 1. Verify mission 1 is completed using the User Management view. Verify mission 2 is not completed since the metadata did not match.
  4. Log an action for a test user where the metadata matches mission 2. Verify mission 2 is completed using the User Management view. Verify mission 1 is not completed since the metadata did not match (see the sketch after this list).
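
A sketch of steps 3 and 4, reusing log_action() from the action-logging sketch above; the action tag and metadata name/value pairs are placeholders for whatever your mission rules actually match on.

```python
# Reuses log_action() from the action-logging sketch above. The metadata keys
# and values below are placeholders -- use the name/value pairs your mission
# rules are actually configured to match.

log_action("test-user-1", "view_content", metadata={"contentType": "video"})    # matches mission 1 only
log_action("test-user-1", "view_content", metadata={"contentType": "article"})  # matches mission 2 only
```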

Points

  • Complete a mission for a test user. Compare the mission's points to the points awarded. Points awarded for a given user can be found in the User Management view.
  • If using a points multiplier, submit an action with an appropriate action value. Be sure that the action will complete a mission. Check the points awarded for the user to ensure that the multiplier was applied (see the sketch after this list). Points awarded for a given user can be found in the User Management view.
  • If applicable, configure a daily points limit within a Point Category. Send multiple actions to complete missions beyond the limit set. Ensure that the user only receives the maximum allowed points.
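
A small helper for writing down the points you expect before checking the User Management view. How the multiplier and daily limit actually combine depends on your configuration, so treat this only as one plausible model, not Nitro's exact calculation.

```python
def expected_points(base_points, action_value=1.0, multiplier=1.0,
                    daily_limit=None, points_awarded_today=0):
    """One plausible model of the points a mission completion should award."""
    raw = base_points * action_value * multiplier
    if daily_limit is not None:
        # Cap by whatever headroom remains under the daily points limit.
        raw = min(raw, max(daily_limit - points_awarded_today, 0))
    return int(raw)

# Example: 100 base points, action value 1, 2x multiplier, 150-point daily cap -> 150.
print(expected_points(100, multiplier=2.0, daily_limit=150))
```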

Webhooks

Complete a mission for a test user. Monitor the Webhook URL to ensure points and/or mission data was received.

Note: Inspecting the webhook payload typically needs to be done by the developer or technical contact involved in setting it up.
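
If you do not yet have a receiver in place, a minimal local inspector such as the sketch below (Python standard library only) can be exposed through a tunnel and registered as the webhook URL. The payload shape is whatever Nitro sends, so it simply prints what arrives.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookInspector(BaseHTTPRequestHandler):
    """Print incoming webhook payloads so mission/points events can be inspected."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            print(json.dumps(json.loads(body), indent=2))   # pretty-print JSON payloads
        except json.JSONDecodeError:
            print(body.decode(errors="replace"))            # fall back to the raw body
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Register the public URL for this port as the webhook endpoint in Site Settings.
    HTTPServer(("0.0.0.0", 8080), WebhookInspector).serve_forever()
```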

Blocks

Impact

  • Verify the UI renders in the source system.
  • Verify imagery renders as expected.
  • In the leaderboard section:
    • Verify user names display as expected.
    • Verify the format displays as expected (precision/prefix/suffix).
    • If applicable, verify multiple leaderboards display as expected.
    • Verify points accumulate on the leaderboard.
    • Verify the leaderboard duration resets as expected.
  • For missions:
    • Verify mission names and descriptions display as expected.
    • If applicable, verify the mission tasks link to the correct location.
    • Complete a mission and confirm that it's removed from the eligible missions list. Note: Repeatable missions will return to the list upon completion.
    • Partially complete a multi-action mission and confirm that the progress updates as expected.
    • Complete a mission and confirm it's shown on the Completed tab.
    • Complete a quiz and verify that the mission updates/completes correctly.
    • Log in as a user who hasn't satisfied specific prerequisites and confirm those missions do not display.
    • Confirm that hidden missions do not display.
  • For the trophy case:
    • Complete a mission that awards a badge and verify it displays in the trophy case.
    • Verify you can copy a badge.
    • Verify you can download a badge.
  • If applicable, verify the Award Account information displays and is linked to the correct Global Rewards Marketplace account.
  • If applicable, verify the On The Spot claim button is visible, you can enter a code, and the amount is applied to the user's award account balance.

Recognitions

  • Verify the UI renders in the source system.
  • Verify recognition categories display as configured and match any corresponding point maximum and minimum settings.
  • View the block as a user who has access based on group membership and confirm information displays correctly.
  • View the block as a user who does not have access based on group membership and confirm the block is not accessible.
  • Submit a recognition and verify the recognition logs as expected, any missions are completed, and the event displays in the list.

Site Settings

Webhooks

Verify the webhook URL is input correctly. Then proceed with the webhook testing scenario under Missions.

Levels

Verify level thresholds are set as expected.

Point Categories

  • Verify a point category exists for each required type (points/recognition).
  • If multiple point categories exist, verify the correct category is set as the default.

Localization

Localization

  • Add a translation file for missions, leaderboards, and other elements. See Localization for details.
  • Add a translation file for each block.
  • Ensure that localized strings appear in all UI components for each language in use.

Impact+

User Registration

Verify a user can register for Impact+.

SSO

Verify single sign-on (SSO) is working.

UI walk-through

  • Verify header and footer links are present and working.
  • Test links to discover any possible browser pop-up issues. Pop-up blockers must be disabled.
  • Verify logos are configured and display as expected.
  • Verify the theme is styled as expected.

User Preferences

Verify users have the right preferences to present the best user experience: firstName, lastName, userProfileUrl.

See also

Getting started

Nitro Studio workspace